Group Importance Sampling for Particle Filtering and MCMC

Authors

  • Luca Martino
  • Victor Elvira
  • Gustau Camps-Valls
Abstract

Importance Sampling (IS) is a well-known Monte Carlo technique that approximates integrals involving a posterior distribution by means of weighted samples. In this work, we study the assignment of a single weighted sample that compresses the information contained in a population of weighted samples. Part of the theory that we present as Group Importance Sampling (GIS) has been employed implicitly in different works in the literature. The provided analysis yields several theoretical and practical consequences. For instance, we discuss the application of GIS within the Sequential Importance Resampling framework and show that Independent Multiple Try Metropolis schemes can be interpreted as a standard Metropolis-Hastings algorithm, following the GIS approach. We also introduce two novel Markov Chain Monte Carlo (MCMC) techniques based on GIS. The first one, the Group Metropolis Sampling method, produces a Markov chain of sets of weighted samples; all these sets are then employed to obtain a single global estimator. The second one is the Distributed Particle Metropolis-Hastings technique, where different parallel particle filters are jointly used to drive an MCMC algorithm: different resampled trajectories are compared and then tested with a proper acceptance probability. The novel schemes are tested in several numerical experiments, such as learning the hyperparameters of Gaussian processes, localization in a wireless sensor network, and tracking vegetation parameters given satellite observations, where they are compared with several benchmark Monte Carlo techniques. Three illustrative Matlab demos are also provided.
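As a rough illustration of the compression idea described in the abstract, the following Python sketch (the paper's own demos are in Matlab) builds several groups of weighted samples, replaces each group with a single summary sample drawn by resampling within the group plus a group weight taken here as the average unnormalized weight, and combines the summaries into one global estimator. The bimodal target, the Gaussian proposal, and the test function f(x) = x^2 are illustrative assumptions, not taken from the paper.

```python
# Hedged sketch of the Group Importance Sampling (GIS) idea: each group of
# weighted samples is compressed into one "summary" sample carrying a single
# group weight. Target, proposal, and f are illustrative choices only.
import numpy as np

rng = np.random.default_rng(0)

def log_target(x):
    # Unnormalized toy target: a bimodal 1-D Gaussian mixture.
    return np.logaddexp(-0.5 * (x - 2.0) ** 2, -0.5 * (x + 2.0) ** 2)

def log_proposal(x, scale=3.0):
    return -0.5 * (x / scale) ** 2 - np.log(scale * np.sqrt(2 * np.pi))

M, N = 50, 100             # M groups, N samples per group
summary_x = np.empty(M)    # one summary sample per group
group_w = np.empty(M)      # one group weight per group

for m in range(M):
    x = rng.normal(0.0, 3.0, size=N)              # draw a group from the proposal
    logw = log_target(x) - log_proposal(x)        # standard IS log-weights
    w = np.exp(logw - logw.max())                 # stabilized unnormalized weights
    group_w[m] = w.mean() * np.exp(logw.max())    # group weight: average unnormalized weight
    summary_x[m] = rng.choice(x, p=w / w.sum())   # summary sample: resample within the group

# Global self-normalized estimator of E[f(x)] built only from the summaries,
# here with f(x) = x**2 as an arbitrary test function.
W = group_w / group_w.sum()
print("GIS-style estimate of E[x^2]:", np.sum(W * summary_x ** 2))
```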


Similar Resources

Population Based Particle Filtering

This paper proposes a novel particle filtering strategy that combines population Monte Carlo Markov chain methods with sequential Monte Carlo particle filtering, which we call evolving population Monte Carlo Markov chain (EP-MCMC) filtering. Iterative convergence on groups of particles (populations) is obtained using a specified kernel that moves particles toward more likely regions. The proposed techniq...
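To make the combination of MCMC moves with particle populations concrete, here is a minimal Python sketch of a resample-then-move step: after weight-based resampling, each particle is refreshed with a random-walk Metropolis kernel that leaves a toy Gaussian target invariant, one standard way of pushing particles toward more likely regions. It is only a generic illustration, not the EP-MCMC algorithm of the cited paper.

```python
# Hedged sketch of a resample-then-move step: particles are resampled by weight
# and then refreshed with a random-walk Metropolis kernel that keeps the target
# invariant. The 1-D Gaussian target is an illustrative assumption.
import numpy as np

rng = np.random.default_rng(1)

def log_target(x):
    return -0.5 * (x - 1.0) ** 2                   # unnormalized N(1, 1) toy target

N = 500
particles = rng.normal(0.0, 2.0, size=N)           # initial population from N(0, 2^2)
logw = log_target(particles) - (-0.5 * (particles / 2.0) ** 2)
w = np.exp(logw - logw.max())
w /= w.sum()

# Resample: duplicate high-weight particles, drop low-weight ones.
particles = rng.choice(particles, size=N, p=w)

# Move: one Metropolis step per particle, nudging the population toward
# higher-probability regions without biasing the target.
proposal = particles + 0.5 * rng.standard_normal(N)
accept = np.log(rng.random(N)) < log_target(proposal) - log_target(particles)
particles = np.where(accept, proposal, particles)

print("posterior mean estimate:", particles.mean())   # should be near 1.0
```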


The Unscented Particle Filter

In this paper, we propose a new particle filter based on sequential importance sampling. The algorithm uses a bank of unscented filters to obtain the importance proposal distribution. This proposal has two very "nice" properties. Firstly, it makes efficient use of the latest available information and, secondly, it can have heavy tails. As a result, we find that the algorithm outperforms standar...
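The core mechanism mentioned above is an importance proposal that exploits the latest observation instead of the transition prior. The sketch below shows only the generic sequential importance-sampling weight update w ∝ w · p(y|x) p(x|x_prev) / q(x|x_prev, y) on a 1-D linear-Gaussian model, with the unscented-filter proposal replaced by an assumed data-informed Gaussian; it is not the full unscented particle filter.

```python
# Hedged sketch of the weight update used when the proposal q differs from the
# transition prior, as in proposal-based particle filters. The unscented-filter
# step is replaced by an assumed Gaussian proposal centered between the
# predicted state and the observation; the 1-D linear-Gaussian model is
# illustrative only.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)
N = 1000
x_prev = rng.standard_normal(N)        # particles at time t-1
w = np.full(N, 1.0 / N)                # their normalized weights
y = 1.5                                # new observation

# Proposal: a data-informed Gaussian (stand-in for a UKF-based proposal).
q_mean = 0.5 * (0.9 * x_prev) + 0.5 * y
q_std = 0.7
x = q_mean + q_std * rng.standard_normal(N)

# Sequential importance-sampling update: w *= p(y|x) * p(x|x_prev) / q(x).
w *= (norm.pdf(y, loc=x, scale=1.0)                 # likelihood p(y | x)
      * norm.pdf(x, loc=0.9 * x_prev, scale=1.0)    # transition prior p(x | x_prev)
      / norm.pdf(x, loc=q_mean, scale=q_std))       # proposal density q(x | x_prev, y)
w /= w.sum()

print("filtered mean estimate:", np.sum(w * x))
```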


Improvement Strategies for Monte Carlo Particle Filters

The particle filtering field has seen an upsurge in interest over recent years, and accompanying this, a number of enhancements to the basic techniques have been suggested in the literature. In this paper we collect together a group of these developments which seem to be particularly important for time series applications and give a broad discussion of the methods, showing the interrelationships ...


Video analysis-based vehicle detection and tracking using an MCMC sampling framework

This article presents a probabilistic method for vehicle detection and tracking through the analysis of monocular images obtained from a vehicle-mounted camera. The method is designed to address the main shortcomings of traditional particle filtering approaches, namely Bayesian methods based on importance sampling, for use in traffic environments. These methods do not scale well when the dimens...


Adaptive importance sampling in signal processing

In Bayesian signal processing, all the information about the unknowns of interest is contained in their posterior distributions. The unknowns can be parameters of a model, or a model and its parameters. In many important problems, these distributions are impossible to obtain in analytical form. An alternative is to generate their approximations by Monte Carlo-based methods like Markov chain Mon...
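As a minimal illustration of the adaptive importance sampling family discussed above, the sketch below repeatedly refits a Gaussian proposal to the weighted samples it generated, so the proposal migrates toward the region where a toy posterior concentrates. The target, initialization, and moment-matching adaptation rule are assumptions for illustration, not the specific scheme of the cited survey.

```python
# Hedged sketch of adaptive importance sampling: the Gaussian proposal is
# repeatedly refitted to its own weighted samples, drifting toward the regions
# where the (toy, assumed) posterior has mass.
import numpy as np

rng = np.random.default_rng(3)

def log_target(x):
    return -0.5 * ((x - 4.0) / 0.5) ** 2       # unnormalized N(4, 0.5^2) "posterior"

mu, sigma = 0.0, 3.0                            # initial, deliberately misplaced proposal
for it in range(10):
    x = mu + sigma * rng.standard_normal(2000)
    logq = -0.5 * ((x - mu) / sigma) ** 2 - np.log(sigma)
    logw = log_target(x) - logq                 # IS log-weights (constants cancel)
    w = np.exp(logw - logw.max())
    w /= w.sum()
    # Adaptation: move the proposal to the weighted mean / std of the samples.
    mu = np.sum(w * x)
    sigma = np.sqrt(np.sum(w * (x - mu) ** 2)) + 1e-6

print("adapted proposal:", mu, sigma)           # should approach (4.0, 0.5)
```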




Journal:
  • CoRR

Volume: abs/1704.02771  Issue:

Pages: -

Publication date: 2017